Equivalence of Backpropagation and Contrastive Hebbian Learning in a Layered Network

Authors

  • Xiaohui Xie
  • H. Sebastian Seung
Abstract

Backpropagation and contrastive Hebbian learning are two methods of training networks with hidden neurons. Backpropagation computes an error signal for the output neurons and spreads it over the hidden neurons. Contrastive Hebbian learning involves clamping the output neurons at desired values and letting the effect spread through feedback connections over the entire network. To investigate the relationship between these two forms of learning, we consider a special case in which they are identical: a multilayer perceptron with linear output units, to which weak feedback connections have been added. In this case, the change in network state caused by clamping the output neurons turns out to be the same as the error signal spread by backpropagation, except for a scalar prefactor. This suggests that the functionality of backpropagation can be realized alternatively by a Hebbian-type learning algorithm, which is suitable for implementation in biological networks.
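To make the two-phase procedure concrete, the following NumPy sketch runs a free phase (the linear output settling freely) and a clamped phase (the output held at the target), then applies the contrastive Hebbian difference of outer products, with a 1/γ factor on the hidden-layer update standing in for the scalar prefactor mentioned above. The layer sizes, logistic hidden units, settling schedule, and exact scaling are illustrative assumptions, not the paper's precise formulation.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hid, n_out = 4, 8, 2                      # illustrative sizes
    W1 = rng.normal(scale=0.1, size=(n_hid, n_in))    # feedforward: input -> hidden
    W2 = rng.normal(scale=0.1, size=(n_out, n_hid))   # feedforward: hidden -> output
    gamma, eta = 0.1, 0.01                            # weak-feedback strength, learning rate

    def sigmoid(u):
        return 1.0 / (1.0 + np.exp(-u))

    def settle(x, y_clamp=None, n_steps=100):
        # Relax to a fixed point: hidden units receive feedforward drive plus
        # weak feedback gamma * W2^T y; the output is linear unless clamped.
        h, y = np.zeros(n_hid), np.zeros(n_out)
        for _ in range(n_steps):
            h = sigmoid(W1 @ x + gamma * (W2.T @ y))
            y = y_clamp if y_clamp is not None else W2 @ h
        return h, y

    def chl_step(x, target):
        global W1, W2
        h_free, y_free = settle(x)                    # free phase
        h_cl, y_cl = settle(x, y_clamp=target)        # clamped phase
        # Contrastive Hebbian update: clamped minus free outer products.
        # For small gamma, (h_cl - h_free) is roughly gamma times the
        # backpropagated error, so dividing by gamma restores the gradient scale.
        W2 += eta * (np.outer(y_cl, h_cl) - np.outer(y_free, h_free))
        W1 += (eta / gamma) * np.outer(h_cl - h_free, x)

    # Toy usage: fit a random linear map and watch the squared error drop.
    X = rng.normal(size=(32, n_in))
    T = X @ rng.normal(scale=0.5, size=(n_in, n_out))

    def mse():
        return float(np.mean((np.array([settle(x)[1] for x in X]) - T) ** 2))

    print("MSE before:", mse())
    for _ in range(200):
        for x, t in zip(X, T):
            chl_step(x, t)
    print("MSE after:", mse())

In a deeper chain the clamped-phase perturbation would shrink by roughly a factor of γ per layer, so earlier layers would need correspondingly larger powers of 1/γ, consistent with the scalar prefactor described in the abstract.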

Similar articles

Seeing is believing: Contrastive Hebbian Clustering for unsupervised one-shot gameplay learning in a recurrent neural network

In this paper we address two desirable abilities that are fundamental to human cognition but remain challenging for machine learning systems: the ability to learn in an unsupervised fashion and recall experiences that have only been seen once. We present a simple training algorithm for a recurrent neural network that simultaneously deals with both issues. It avoids backpropagation entirely and ...

Towards deep learning with spiking neurons in energy based models with contrastive Hebbian plasticity

In machine learning, error back-propagation in multi-layer neural networks (deep learning) has been impressively successful in supervised and reinforcement learning tasks. As a model for learning in the brain, however, deep learning has long been regarded as implausible, since it relies in its basic form on a non-local plasticity rule. To overcome this problem, energy-based models with local co...

Contrastive Hebbian Learning in the Continuous Hopfield Model

This paper shows that contrastive Hebbian learning, the algorithm used in mean field learning, can be applied to any continuous Hopfield model. This implies that non-logistic activation functions as well as self-connections are allowed. Contrary to previous approaches, the learning algorithm is derived without considering it a mean field approximation to Boltzmann machine learning. The paper includes a...
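As a rough illustration of that two-phase rule (a generic sketch, not this paper's derivation), the fragment below relaxes a small symmetric network twice, once with only the inputs clamped and once with inputs and outputs clamped, and updates the weights by the difference of correlations; the tanh activation, unit split, and step sizes are assumptions chosen for the example.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 6                                   # total units (inputs, hidden, outputs)
    W = rng.normal(scale=0.1, size=(n, n))
    W = (W + W.T) / 2                       # symmetric weights; diagonal (self-connections) kept

    def relax(clamp_idx, clamp_val, n_steps=200, dt=0.1):
        # Settle dx/dt = -x + tanh(W x) with some units clamped; tanh stands in
        # for any suitable activation (it need not be logistic).
        x = np.zeros(n)
        x[clamp_idx] = clamp_val
        for _ in range(n_steps):
            x += dt * (-x + np.tanh(W @ x))
            x[clamp_idx] = clamp_val        # re-impose the clamp every step
        return x

    def chl_update(inp_idx, inp, out_idx, target, eta=0.05):
        global W
        x_free = relax(inp_idx, inp)        # free phase: inputs clamped
        x_cl = relax(np.r_[inp_idx, out_idx], np.r_[inp, target])  # clamped phase
        # Contrastive Hebbian rule: clamped minus free correlations.
        W += eta * (np.outer(x_cl, x_cl) - np.outer(x_free, x_free))

    # Hypothetical split: units 0-1 as inputs, 4-5 as outputs.
    chl_update(np.array([0, 1]), np.array([0.5, -0.3]),
               np.array([4, 5]), np.array([1.0, 0.0]))

Because the update is a difference of symmetric outer products, the weight matrix stays symmetric, which is what keeps the Hopfield energy function well defined.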

Sequential Learning in Distributed Neural Networks without Catastrophic Forgetting: A Single and Realistic Self-Refreshing Memory Can Do It

In sequential learning tasks, artificial distributed neural networks forget catastrophically: newly learned information most often erases what was learned before. This major weakness is not only cognitively implausible, since humans forget gradually, but also disastrous for most practical applications. An efficient solution to catastrophic forgetting has been recently proposed for backpropaga...

Is VLSI Neural Learning Robust against Circuit Limitations?

An investigation is made of the tolerance of various in-circuit learning algorithms to component imprecision and other circuit limitations in artificial neural networks. Supervised learning mechanisms, including backpropagation and contrastive Hebbian learning, and unsupervised soft competitive learning are all shown to be tolerant of those levels of arithmetic inaccuracy, noise, nonlinearity, we...

Journal:
  • Neural computation

Volume 15, Issue 2

Pages: -

Published: 2003